Here's a selection of my work 👇 Please also take a look at my Advanced Computer Science MSc work at Keele University on my GitHub account for some of the more technical coding I've done. I have experience with Python, Java, JavaScript and PHP, and spent three months learning TensorFlow to implement Generative Adversarial Networks to detect fake news.
In 2012, I was employed as a data journalist and learned Python in order to develop my slicing and dicing skills. I've found it a useful glue language which has allowed me to acquire data in various formats - CSV, XML, JSON etc. - using web scraping, as well as from conventional sources such as spreadsheets and APIs.
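That "glue" role can be sketched in a few lines: the same records arriving as CSV and as JSON, pulled into one common shape with nothing but the standard library. The sample data below is made up purely for illustration.

```python
import csv
import io
import json

# Hypothetical sample data: the same record arriving in two formats.
csv_text = "name,casualties\nNorth Staffordshire,4715\n"
json_text = '[{"name": "North Staffordshire", "casualties": 4715}]'

def rows_from_csv(text):
    """Parse CSV text into a list of dicts, one per row."""
    return list(csv.DictReader(io.StringIO(text)))

def rows_from_json(text):
    """Parse JSON text into a list of dicts."""
    return json.loads(text)

csv_rows = rows_from_csv(csv_text)
json_rows = rows_from_json(json_text)

# CSV delivers everything as strings, so normalise the types
# before the two sources can be compared or merged.
for row in csv_rows:
    row["casualties"] = int(row["casualties"])

assert csv_rows == json_rows
```

In practice the CSV or JSON text would come from a scraped page, a download, or an API response rather than a string literal, but the normalising step is the same.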
Previously I spent 20 years in journalism. I specialised in business journalism - eventually becoming business editor - before moving on to become a web publisher. During that time I became interested in the potential of data, and in 2012 I was given the chance to become a data journalist and analyst.
The Commonwealth War Graves Commission provides free access to the records of thousands of service personnel who have lost their lives in conflict. This project, started in 2012, allowed readers to explore a map showing the final memorial for up to 13,000 casualties from the North and South Staffordshire Regiments.
The final work involved significant effort to clean and visualise the data.
The STATS19 dataset from the Department for Transport provides a fascinating and rich record of millions of traffic accidents dating back to the 1970s. I have used it to explore trends in accidents over time, to examine particular hotspots, and to use technology to bring the data to life.
A very early attempt, from 2014, to use Python and Tableau to assess the work of Stoke-on-Trent's greatest pop export, Robbie Williams. I used Python to scrape the lyrics of his songs from a website - a scrape which missed out Angels. I then used NLTK (a Python module for natural language processing) to calculate the positive and negative sentiment of those lyrics. I plotted the data using Tableau with time on the X axis and chart position on the Y axis, with the colour based on the sentiment. It is a glorious failure from my early days with Tableau.
I have made extensive use of the crime statistics released by the Home Office, both to provide snapshots of crime across North Staffordshire and to visualise trends over time (see below for August 2014). This map has a selector making it possible to find out which crimes happen where.
The price-paid dataset of house sales provides plenty of opportunities to plot house-price trends at both national and local level, and can be combined with earnings data to consider the issue of affordability.
This chart took an idea from the Telegraph newspaper about the most expensive property postcodes in the UK and looks instead at the most affordable in North Staffordshire and South Cheshire. The legend shows the percentage of people in the area who could afford the median-priced home, assuming a mortgage of 4.5 times median earnings.
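The affordability test behind that legend is simple arithmetic, sketched below. The 4.5 loan multiple comes from the chart; the earnings and price figures are invented for illustration, not the values used in the published work.

```python
# Sketch of the affordability test: can a borrower on given earnings,
# lent 4.5x their salary, cover the median house price?
LOAN_MULTIPLE = 4.5

def can_afford(earnings, median_price, multiple=LOAN_MULTIPLE):
    """True if a mortgage of `multiple` x earnings covers the median price."""
    return earnings * multiple >= median_price

# Hypothetical sample of local earners and a hypothetical median price.
earnings_sample = [18_000, 22_000, 26_000, 31_000, 45_000]
median_price = 120_000

affordable = sum(can_afford(e, median_price) for e in earnings_sample)
share = affordable / len(earnings_sample)
print(f"{share:.0%} could afford the median home")  # 40% for this sample
```

Run against the full earnings distribution for each postcode area, that share is what the map's legend reports.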
Results for Key Stage 4 in 2014.
A Freedom of Information request provided details of more than 100,000 parking tickets issued by Stoke-on-Trent City Council over five years. The map provided a helpful way to highlight hotspots as well as trends on particular streets across the city.
A retired miner compiled a list of 4,715 men who lost their lives in the pits of North Staffordshire. I cleaned the manually entered data against reference texts to provide an insight into the changing conditions for those who spent up to 12 hours underground extracting coal to power the region's pottery and steel industries.